
The 26th International Conference on Auditory Display (ICAD 2021), June 25-28, 2021, Virtual Conference

TILTIFICATION — AN ACCESSIBLE APP TO POPULARIZE SONIFICATION

Malte Asendorf, Moritz Kienzle, Rachel Ringe, Fida Ahmadi, Debaditya Bhowmik, Jiumeng Chen, Kelly Huynh, Steffen Kleinert, Jatawan Kruesilp, Ying Ying Lee, Xin Wang, Wei Luo, Navid Mirzayousef Jadid, Ahmed Awadin, Varun Raval, Eve Emily Sophie Schade, Hasanur Jaman, Kashish Sharma, Colin Weber, Helena Winkler, Tim Ziemer

University of Bremen, Bremen, Germany
[email protected]

This work is licensed under a Creative Commons Attribution Non Commercial 4.0 International License. The full terms of the License are available at http://creativecommons.org/licenses/by-nc/4.0/

ABSTRACT

This paper presents Tiltification, a multi-modal spirit level application for smartphones. The non-profit app was produced by students in the master project "Sonification Apps" in the winter term 2020/21 at the University of Bremen to learn how to conceptualize, implement, market and test smartphone apps. In the app, psychoacoustic sonification is used to give feedback on the device's rotation angles in two plane dimensions, allowing users to level furniture or take perfectly horizontal photos. Tiltification supplements the market of spirit level apps with detailed auditory information presentation. This provides additional benefit in comparison to a physical spirit level and more accessibility for visually and cognitively impaired people. We argue that the distribution of sonification apps through mainstream channels is a contribution to establishing sonification in the market and making it better known to users outside the scientific domain.

1. INTRODUCTION

Sonification is well suited [1, 2, 3] but widely neglected [1] as a user interface for human-machine and human-computer interaction. Only a limited number of sonification apps are available to users outside the scientific community. As users are typically inexperienced in the use of auditory displays, developers are reluctant to place sonification at the center of their apps. Furthermore, the idea of sonification as the functional core of tools and apps is not widely known among software developers. Users, in turn, may have reservations about perceiving information in auditory form only, because they have a lifelong experience of relying on vision as their primary source of information. Typically, they have never used a sonification that is more informative than the primitive beeps of their washing machine. To break this vicious circle, some researchers and developers have created apps that make sonification better known and more widely used.

This paper reflects on the project "Sonification Apps" in the winter term 2020/21 at the University of Bremen. The objective of this project was to develop an app that leverages sonification as its core element. The app is supposed to be practical, intuitive, handy and accessible to people with visual impairment and reading disabilities. With this app, the project team hopes to contribute to promoting sonification, attracting interest in sonification and overcoming inhibitions about relying on sonification as the primary channel of data communication.

In the remainder of this paper, an overview of publicly available sonification apps is given. Furthermore, details on the development of Tiltification are presented. To conclude this paper, we discuss how this app can serve as a means to make sonification better known, and we point out continuative research topics.
2. RELATED WORK

The idea of a "killer app" is widespread within the auditory display community [4, 5, 6], [7, pp. 105f]. The hope is that a certain sonification-centered application will be so easy to use, convincing and advantageous that it represents the breakthrough of sonification as a mainstream tool. However, some sonification researchers argue that a killer app is not required for sonification to establish itself. When enough useful products are brought to the market, sonification will eventually reach critical mass and become accepted as an alternative or complement to visualization [4, 5]. They refer to sonifications that are already established in their fields, like auditory pulse oximetry [8] and the Geiger counter [9]. However, such tools are intended for professionals and are neither useful nor affordable for private customers.

Both the hope for a killer app and the pragmatic assumption that sonification simply needs more available products are in stark contrast to the limited number of sonification apps that is available to the public at all. Many apps have been conceptualized or even implemented, but not sustainably released through mainstream channels, like the Google Play Store and Apple App Store [10, 11, 12, 13]. Searching for keywords like "sonification" and "auditory display" in the Apple App Store yields no results. Searching for "sonification" in the Google Play Store yields 250 apps. However, most of these apps have little to do with sonification.

Over the last couple of years, a number of sonification apps have been made available to the public. Some of them actively communicate that their informative sound is called "sonification". On the one hand, these apps serve to provide useful information and functionality to users by means of sound. On the other hand, they support the auditory display community in advertising and promoting sonification in general, and as a core element of software functionality.

Figure 1: Physical one-dimensional (left) and two-dimensional (right) spirit levels.

When searching for "sonification apps" in Google, you find the following four apps: A both informative and entertaining sonification app is the Amino Acid Synthesizer [14], which makes amino acid sequences audible on Android devices. Even though the target group for amino acid sequences seems rather narrow, the Google Play Store already counts over 5000 downloads. Quite related is the DNA sonification web app [15], which allows users to explore random, pre- or self-defined DNA sequences aurally. To date, the videos on the author's YouTube channel have over 700 views. The multi-platform CURAT Sonification Game [16] is available for Android, Mac, Linux and Windows. It contains five minigames in which players find targets by interpreting and interacting with sonification. Gamification is leveraged to motivate users to play conscientiously over a long period of time. To date, the game counts over 500 downloads. The Sonification Sandbox [17] is a cross-platform software that allows the user to load data from CSV files, plot them and sonify them on a MIDI basis. The app has a Graphical User Interface (GUI), multiple pre-defined mappings and several options to tune the sonification and to export it as a MIDI file.
3. THE TILTIFICATION APP

In this paper, Tiltification is presented, a mobile device application that embodies a spirit level – also known as carpenter's level, mechanic's level, water level, bubble level, torpedo level, mason's level or tubular level. These devices are clinometers that indicate their own deviation from horizontal alignment visually, through the position of a bubble in a liquid. They work in one or two dimensions, as demonstrated in Figure 1. Tiltification enhances the inherent core functionality of measuring deviations from ideal leveling by giving auditory feedback in addition to a visual display. By doing so, sonification becomes a central component and crucial merit. While numerous physical spirit levels and spirit level apps for mobile devices are commercially available, Tiltification is the first one to indicate alignment in two dimensions in a multi-modal way using graphics, text and sonification. The app is available for iOS and Android on Apple's and Google's app stores. In contrast to other acoustic spirit level apps, like Bo Kalvslund's Spirit Level with Sound and NixGame's bubble level app (https://play.google.com/store/apps/details?id=org.nixgame.bubblelevel), the sonification in Tiltification informs the user how far to tilt the smartphone along each angle to achieve perfect leveling.

3.1. Implementation

To realize Tiltification as a mobile app, the following components had to be researched, designed, implemented, tested and eventually integrated:

1. receiving and pre-processing sensor data
2. auditory feedback based on the "psychoacoustic sonification" [18]
3. visual feedback

Figure 2: Architecture of the application.

The visual and auditory tilt level feedback represent the two essential features for the users. While the visual feedback is implemented with core functionality, the auditory feedback of Tiltification is realized in Pure Data. Both their backgrounds and realizations are further discussed in Sections 3.2 and 3.3. In order to provide both display formats with live data, the application continuously listens to an accelerometer event dispatcher and applies filtering and normalization algorithms to determine the tilt angles. Depending on the device's operating system, the app calls either the Objective-C (iOS) or the Java (Android) implementation of Pure Data (libpd). Both libraries are embedded. Visual and auditory level feedback is adjusted to reflect the degree to which the phone is tilted. The diagram in Figure 2 illustrates the architecture of the application.

3.1.1. Auditory implementation with Pure Data

Pure Data (PD) is a visual programming environment for creating audio, video and graphical output through a graphical interface, without the need to write code. In PD, different elements (atoms) are used to represent specific functions, e.g. to receive data and to generate output. Atoms can exchange data and be interconnected. A PD program, also called a PD patch, is a simple encoded file with a .pd extension which can be interpreted by different libraries, like libpd, for audio generation. The functionality is comparable to the principles of electronic audio or music creation: oscillators and various effects can be created, combined and manipulated as objects in a patch representing an individual program. As a library, PD is embeddable into various other environments [19, p. 5ff]. In contrast to PD, libpd has no audio drivers, no MIDI drivers, no user interface and no thread synchronization. In addition to PD as the core library, libpd includes language bindings and utilities for different programming languages like Java, Objective-C and others [20]. These bindings are used to make the functionality of libpd available in the corresponding target languages [21, p. 43ff]. In our application, we use the Java (https://github.com/libpd/pd-for-android) and Objective-C (https://github.com/libpd/pd-for-ios) APIs of the libpd library to interact with it and generate audio dynamically in real time.
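As a rough illustration of how such a binding is used, the following sketch opens a patch with libpd's Java API (PdBase) and forwards a value to a named receiver. It is only a sketch, not the app's actual code: the patch file name tiltification.pd and the receiver name pitch_angle are hypothetical, and on Android or iOS the audio glue of pd-for-android or pd-for-ios would normally drive the rendering.

```java
import org.puredata.core.PdBase;

import java.io.File;
import java.io.IOException;

// Minimal libpd (Java binding) sketch. Patch file name and receiver name are hypothetical.
public class LibPdSketch {
    public static void main(String[] args) throws IOException {
        // Initialize the audio engine: 0 input channels, 2 output channels, 44.1 kHz.
        PdBase.openAudio(0, 2, 44100);

        // Load the Pure Data patch that contains the sonification.
        int patch = PdBase.openPatch(new File("tiltification.pd"));

        // Switch DSP on (corresponds to "pd dsp 1").
        PdBase.computeAudio(true);

        // Forward a tilt angle (in degrees) to a [receive pitch_angle] object in the patch.
        PdBase.sendFloat("pitch_angle", 3.7f);

        // Since libpd has no audio drivers of its own, the host renders audio by calling
        // PdBase.process(...) or, on mobile, through the platform-specific libpd glue.

        PdBase.closePatch(patch);
    }
}
```

The same pattern applies to the Objective-C binding on iOS; only the values sent to the receivers have to be updated whenever new sensor data arrive.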
3.1.2. Flutter

To develop the app, Flutter was selected as the framework, as it allows programming cross-platform apps. This saves the additional work of developing native applications in different programming languages for both Android and Apple smartphones [22, 23]. Flutter uses Skia, its own rendering engine, to render views instead of relying on web wrappers [24], which provides for very good performance compared to other solutions [23]. Flutter was released in 2018 by Google and is still being further developed and improved [22, 23]. This provides for stability and further support in the future, compared to frameworks developed by smaller companies. It also has hundreds of non-Google contributors who provide a large number of useful libraries. These make Flutter a good choice for a multitude of projects [22]. Other frameworks were inspected beforehand. For example, React Native, probably the best-supported and most-used alternative to Flutter, shares a lot of the cross-platform capabilities, but lacks scalability and performance [24]. Other qualities also speak for Flutter in this comparison. Furthermore, prototyping with React Native showed that its audio capabilities do not fulfil the requirements for the sonification part of the project.

3.1.3. Language support

To make Tiltification useful to as many users as possible, the app is distributed in four languages: English, German, Spanish and Chinese. This concerns most of the textual information available:

• in-app information, e.g. buttons, menus, imprint etc.
• app store information, e.g. the app description
• help and FAQ resources

These localizations could be realized within the project team.

3.2. Design

Although Tiltification mainly uses sound to convey the angle, a visual interface was created so that the application appeals to users and provides visual cues in addition to the sounds.

3.2.1. Accessibility

The interface was designed in an iterative process with the users in mind. Prior to the design, research was conducted to determine who potential users of the application could be and what their needs are. For this, an analysis of different competitor apps and a comparison of the user experience was conducted first. A total of 14 similar apps from both the Android and iOS store were considered, in addition to various analog spirit levels. Additionally, different hypothetical personas were developed to represent the different user groups. When developing the user interface, it was important to create a minimalist, clean, and easy-to-understand interface so that users of the application can easily read the measurements. As accessibility is considered a door-opener for the acceptance of sonification apps [4, 25], we incorporated an accessible app design for people with visual impairments, dyslexia, and color blindness.

3.2.2. Visual Impairments

The color palette was inspired by typical colors of a spirit level, such as green, yellow, white, and black. Additional attention was paid to ensuring sufficient contrast between the colors. The three most common types of color blindness include deficiency in seeing red-green colors (i.e. Deuteranopia, Deuteranomaly), blue-yellow colors (i.e. Tritanopia, Tritanomaly), and all colors (complete color blindness) [26]. A red-green color vision deficiency affects approximately 8% of men and 0.5% of women of Northern European origin [27]. It was important that the colors were not too similar, so that the app would be accessible for users with color blindness or visual impairments [26]. Furthermore, in order to support users with dyslexia, it was important to use single-color backgrounds such as white or a dark grey [28]. For example, the proposed color scheme of the user interface consists of a light green in the foreground (#59DC97) and a dark grey in the background (#313843). This results in a contrast ratio of 6.81:1 and therefore passes the user interface contrast requirement of the Web Content Accessibility Guidelines at level AA [29].
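As a quick plausibility check of the reported ratio (not part of the app itself), the following sketch evaluates the contrast formula from WCAG 2.0 [29], i.e. the relative luminance of each color via the sRGB transfer function, followed by (L1 + 0.05)/(L2 + 0.05). The class and method names are purely illustrative.

```java
// Sketch: WCAG 2.0 contrast ratio of the foreground (#59DC97) and background (#313843) colors.
public class ContrastCheck {

    // Relative luminance of an sRGB color, as defined by WCAG 2.0.
    static double luminance(int rgb) {
        double[] c = { (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF };
        for (int i = 0; i < 3; i++) {
            double s = c[i] / 255.0;
            c[i] = (s <= 0.03928) ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
        }
        return 0.2126 * c[0] + 0.7152 * c[1] + 0.0722 * c[2];
    }

    // Contrast ratio (L1 + 0.05) / (L2 + 0.05), with L1 the lighter of the two luminances.
    static double contrast(int a, int b) {
        double la = luminance(a), lb = luminance(b);
        return (Math.max(la, lb) + 0.05) / (Math.min(la, lb) + 0.05);
    }

    public static void main(String[] args) {
        // Prints 6.81:1, matching the ratio reported above (WCAG AA requires at least 4.5:1 for normal text).
        System.out.printf("%.2f:1%n", contrast(0x59DC97, 0x313843));
    }
}
```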
Other user-friendly features were also considered, such as the function to turn off the sound and the option of saving offset measurements. Although the visual feedback is not necessarily helpful for blind users, a visual aid is needed for the usability of others. Since the sonification tones are unfamiliar to many users and therefore not self-explanatory, especially in the beginning, the visual representation helps to learn the meaning of the sounds. While the menu of the app might lack accessibility for blind users, the main functionality of the app (leveling objects) can be used without having to access the menu at all. It was decided to keep the basic functions easy to access without having to select anything, and the automatic mode switching according to natural positions – two axes when the phone is on its back, one axis when the phone is upright – makes it easy to select the right mode.

3.2.3. Dyslexia

Studies revealed that dyslexia may occur in 10% of the population [30]; newer studies estimate that 15% of the Indian population has a form of dyslexia [31]. Consequently, our app design focuses on iconic depictions rather than text. We chose a font without serifs, avoid italics and all-caps, use a large font size, structure text with bold and large headlines, keep text passages short and use short terms [32, p. 281], [33, p. 649].

3.2.4. Evaluation from Accessibility Applications

In addition to evaluating the color contrast of the application based on the Web Content Accessibility Guidelines, the Tiltification application has been reviewed with the Accessibility Engine Application and Google's Accessibility Scanner for Android. The feedback received from the two accessibility applications concerned minor changes, which included labelling the objects in the Android code and making the icon sizes at least 44x44 pixels – all of which has been addressed appropriately in development.

3.2.5. User Interface

Throughout the development process, two user interface designs were created. They are shown in Fig. 3.
Prototype A displayed the tilt angle and direction with a simple line as visual representation, and Prototype B demonstrated a liquid effect when the surface was levelled – note that the values in the prototypes are for demonstrative purposes only and are not reflective of actual measurements.

Figure 3: Prototype A (left) and B (right).

In order to better understand which prototype was best suited for the use case, user interviews were conducted with simple click-through prototypes. Four participants were interviewed, including two doctoral students, one musicologist, and a safety aerospace engineer. Furthermore, the interviews were structured in two parts: one part allowed the participants to freely explore the prototype, and the second part included a task where the participants had to place the prototype as if they were levelling an object (i.e. a table). When the participants were asked to rate how easy it was to navigate through the prototypes ("On a scale of 1-5, 5 being really easy, how easy was it to navigate through the application?"), both prototypes received the same arithmetic mean of 4.5. Afterwards, the prototypes were discussed internally with regard to feasibility, and it was decided to move forward with Prototype B, as a dark screen tends to be more energy efficient and less fatiguing to the eyes. Furthermore, the placement of the menu buttons in Prototype B provided a better user experience, as they could be reached more easily and would not be hit accidentally while using the app.

Figure 4: Final design in one-axis mode (left) and two-axes mode (right).

Figure 6: Graphical user interface of the app. Large, minimalistic text, graphical depictions and discriminable colors make the app accessible to people with visual impairment and reading disabilities.

Although the needs of users with visual impairments and dyslexia were taken into account, it is important to note that further improvements can always be made to address accessibility issues. For example, it is worth exploring other color combinations beyond green and grey [26].

3.2.6. How the Application Works

When the user first opens the application, a loading animation of the logo is shown. Afterwards, the application displays a three-swipe onboarding, allowing the user to preview the purpose and usages of this application, specifically that the mobile application is an all-in-one measuring tool targeted at at-home projects (i.e. levelling tables and shelves). These initial screens can be seen in Fig. 5.

Figure 5: After the loading animation of the logo, the following three screens display the purpose and use of this application.

After the onboarding and loading screen, the user is taken to the main user interface, which can be seen in Fig. 6. The white plus symbol on the black background represents the target angle. The green circle and the gray circle indicate an approximate distance to the target angle and merge in a playful manner when this angle is reached. At this point the larger circle also turns green. In addition to the psychoacoustic sonification and the graphical representation, the larger of the two smartphone angles is displayed as large yellow text. The menu shows iconic depictions in yellow. The purpose of consolidating the menu at the bottom is to place the buttons within reach of the thumb. From left to right, they refer to audio, orientation, preset, and additional information.
Audio: Clicking the sound menu icon either mutes the sound, makes it audible when Tiltification is in the foreground, or makes it audible when Tiltification is either in the foreground or in the background. This way the Tiltification sonification stays audible, e.g., while using a camera or a compass app.

Orientation: The icon for the orientation settings is a padlock. Clicking the padlock icon opens the settings for the orientation. Here, the user can choose whether the leveling is supposed to be one-dimensional, two-dimensional, or adaptive to the current smartphone angles. Furthermore, the user can choose between portrait orientation, landscape orientation, and adaptive orientation. If non-adaptive modes are selected, the icon becomes a locked padlock.

Preset: The icon for presets is a gear that appears to adjust an elevation angle. After clicking the icon, users can define and save an offset angle, for example to level something at 15° instead of horizontal level.

Additional information: The icon for additional information is a question mark. Here, users can find our legal notice, an FAQ, contact information and a link to a Google Forms user survey.

Clicking the eye icon at the top of the screen displays or hides a small text that indicates the x- and the y-angle individually. All set options, like the sound, mode, orientation or presets, are saved when closing the application for the next use. As an additional aid to the user, all icons show a description on a long press. The application avoids inaccessible design elements, like multiple color shades, textures, and gradual color or brightness transitions. In the future, a tutorial is going to be implemented to explain the sonification as well as the functions of the buttons to the user.

In a practical use case, the user places the phone on a one-dimensional surface (e.g. on a picture frame), which will display the spirit level user interface (see left image in Fig. 4), or on a two-dimensional surface (e.g. on top of a table), which will display the bubble level interface (see right image in Fig. 4). The user can see and hear whether the surface is levelled or tilted at an angle. Furthermore, the user can see and hear specifically where the user would have to adjust in order to appropriately level the table.
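The automatic switching between the one-axis and the two-axes interface could, for example, be derived from the gravity vector delivered by the accelerometer. The following sketch shows one plausible way to make that decision; the 45° threshold and the method are assumptions for illustration, not necessarily Tiltification's actual logic.

```java
// Sketch: choosing between the two-axes ("bubble level") and one-axis ("spirit level") mode
// from the gravity vector reported by the accelerometer. Threshold and method are assumptions,
// not necessarily the logic used in Tiltification.
public class ModeSelector {

    enum Mode { TWO_AXES, ONE_AXIS }

    /**
     * gx, gy, gz: gravity components in the device frame (z = screen normal).
     * If gravity points mostly through the screen, the phone lies on its back -> two-axes mode;
     * otherwise it is held upright -> one-axis mode.
     */
    static Mode selectMode(double gx, double gy, double gz) {
        double g = Math.sqrt(gx * gx + gy * gy + gz * gz);
        double tiltFromFlat = Math.toDegrees(Math.acos(Math.abs(gz) / g));
        return (tiltFromFlat < 45.0) ? Mode.TWO_AXES : Mode.ONE_AXIS;
    }

    public static void main(String[] args) {
        System.out.println(selectMode(0.0, 0.1, 9.8));  // lying flat   -> TWO_AXES
        System.out.println(selectMode(0.0, 9.8, 0.1));  // held upright -> ONE_AXIS
    }
}
```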
3.3. From sensors to sound

In order to provide the visual and auditory feedback of Tiltification with live data, mobile device sensors are utilized. Smartphones and tablets contain various sensors, e.g. cameras or microphones. A little less obvious are sensors to identify spatial orientation, e.g. to distinguish landscape and portrait mode or, in this case, to identify lying at rest and parallel to the surface of the earth. Depending on the model, at least one or a combination of sensors is implemented. Research on such sensors has been extensive in the fields of aerospace and navigation. In practical use, they are designed as Micro-Electro-Mechanical Systems (MEMS) to transfer the originally bulky technology to a scale that fits into modern electronic devices [34, p. 2004]. These miniature mechanical and electronic elements range from a few microns to several millimetres in size and vary from simple structures to complex electro-mechanical systems with various elements. Most smartphones feature MEMS in the form of an Inertial Measurement Unit (IMU), in detail accelerometers, a gyroscope and sometimes a magnetometer, which play an important part in determining the orientation of the phone.

3.3.1. Accelerometer

Accelerometers measure the rate of change of velocity of an object with respect to its reference frame in real time by determining the inertial force affecting a proof mass. For example, an accelerometer measures gravity as 1 g of acceleration facing upwards, away from the earth, which may be unexpected, since a movement towards the earth is usually considered to involve acceleration. Following Newton's laws, an object at rest in a gravity field obtains an upward acceleration. One accelerometer determines force along one axis. To measure the forces along three axes, three accelerometers are combined; a three-dimensional force vector represents their measurement results. For a spirit level app, only the gravity force along the three axes is relevant, since the phone is assumed not to experience any other acceleration force while lying at rest [35, p. 80ff], [36, p. 2258f]. Accelerometers are subject to errors, e.g. a bias error, when the measured 0 g deviates from the ideal 0 g. The measured signal also tends to be noisy, e.g. due to electronic noise from the circuitry. Since accelerometers are installed in all current smartphone models [37] and deliver quite reliable information on the orientation in space, the visual and auditory feedback of Tiltification relies on their measurements. To provide an implementation that is equally usable on all devices and to avoid larger measurement deviations, the following filtering methods are applied.

3.3.2. Filtering

Smartphone sensors continuously deliver analogue data that are converted to discrete digital values. Filters help to obtain a sensor signal with as little noise as possible. In a very simple form, high- and low-pass filters reduce the signal to a specified spectrum above or below a certain threshold. To reduce the effect of sudden changes, a limited sample rate can be applied to the signal. In Tiltification, a low-pass filter is implemented by weighting the latest raw sensor value with 80 percent and the previous one with 20 percent. Sensor events are processed at the maximum rate the phone offers, so there is no precise sample rate limitation. More elaborate filters, such as the Complementary and the Kalman filter, enable so-called sensor fusion, i.e., they take another smartphone sensor into account. Following this idea, two sensors are combined to cancel out most flaws. While the Complementary filter provides for less processing delay due to its simpler calculations, the more complex Kalman filter might yield better results [34, p. 2005f]. A gyroscope can serve as a secondary sensor, as it provides information about changes in the tilt angle calculated from the accelerometer measurements. Since gyroscopes are not installed in all smartphones – some sources mention 50% [37], others only about 28% [38] – plans to implement sensor fusion were discarded; they would only have been picked up again if sufficient results had not been achieved with the less complex filters.

3.3.3. Sound creation

The filtered and normalized accelerometer data are the basis for calculating the tilt angles. In PD, these serve as input to an implementation that produces a specific sound spectrum for each direction. Following the aeronautical convention of symmetrical axes, pitch and roll are used to identify rotations around the two horizontal axes. Positive and negative rotations around both axes are indicated by acoustic feedback with a recognizable sound design.
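The following sketch pulls the pieces of this pipeline together: the simple 80/20 low-pass weighting described above, a standard accelerometer-only pitch/roll estimate, and forwarding of both angles to the PD patch via libpd. The receiver names are hypothetical and the angle formulas are the textbook estimate, not necessarily the exact code used in the app.

```java
import org.puredata.core.PdBase;

// Sketch of the sensor-to-sound path: low-pass filter the raw accelerometer samples
// (80% latest value, 20% previous value, as described above), derive pitch and roll,
// and forward both angles to the Pure Data patch. Receiver names are assumptions.
public class TiltPipeline {

    private double fx, fy, fz;                // filtered gravity vector
    private static final double ALPHA = 0.8;  // weight of the latest raw sample

    /** Called for every accelerometer event with the raw acceleration in m/s^2. */
    public void onSensorChanged(double ax, double ay, double az) {
        // Simple low-pass filter: weight the latest value with 80%, the previous one with 20%.
        fx = ALPHA * ax + (1 - ALPHA) * fx;
        fy = ALPHA * ay + (1 - ALPHA) * fy;
        fz = ALPHA * az + (1 - ALPHA) * fz;

        // Accelerometer-only tilt estimate, valid while the phone lies at rest:
        // pitch = rotation about the x-axis, roll = rotation about the y-axis.
        double pitch = Math.toDegrees(Math.atan2(fy, Math.sqrt(fx * fx + fz * fz)));
        double roll  = Math.toDegrees(Math.atan2(fx, Math.sqrt(fy * fy + fz * fz)));

        // Hand both angles to the sonification patch (receiver names are hypothetical).
        PdBase.sendFloat("pitch_angle", (float) pitch);
        PdBase.sendFloat("roll_angle", (float) roll);
    }
}
```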
Figure 7: The two tilt angles. Altering the pitch angle (green) in either direction lets the audible pitch rise or fall continuously. Altering the roll angle (purple) makes the sound either rougher or lets its loudness fluctuate periodically.

Because these movements are orthogonal to each other and occur simultaneously, the sound design has to convey the information on the two rotations distinguishably. The following sounds were implemented, in accordance with [39]:

• positive pitch angle: audible pitch decreases
• negative pitch angle: audible pitch rises
• positive roll: loudness fluctuates
• negative roll: sound becomes rough

All sound attributes become faster/more intense the greater the deviation from the ideal leveling. The sonification can be explained using Fig. 7. A horizontal phone creates a smooth, complex tone with steady pitch and steady loudness. A positive pitch angle (green) lets the audible pitch fall continuously; the larger the angle, the faster the audible pitch falls. A negative pitch angle lets the audible pitch rise continuously; the larger the angle, the faster the audible pitch rises. At positive roll angles (purple) the sound becomes rough; the larger the angle, the rougher the sound. At negative roll angles the loudness fluctuates periodically; the larger the angle, the faster the loudness fluctuation. If the values of both degrees of freedom are between 0° and 2° off, an additional pink noise indicates closeness to the target level. Experiments with interactive sonification confirmed that the two dimensions are perceived as orthogonal [18]. Participants found sonified targets as small as 0.03% of a sonified space, which implies that over 3000 different locations may be interpretable by interactive users. Such a precision is much higher than the precision of the acceleration sensors and of conventional spirit levels that use a bubble in a liquid.

3.4. Use Cases

A survey was conducted to reveal the usefulness and potential use cases of such an app. In total, 67 persons took part in the survey. Half of the participants were students, 40% office workers, and 10% in other occupations. In terms of gender distribution, male and female participants took part in equal shares, while 3% chose not to say. About 64% have already used a spirit level, while 46% of the participants do not own one. This is consistent with the fact that 31% of the participants stated that they do home improvement rather seldom, and very few had a construction background. The remaining 54% either owned a physical spirit level or a smartphone app (25%). Most participants use a spirit level to set up furniture (64%), to do construction work (54%) and to hang pictures and paintings (49%). Furthermore, tile work and TV setup were notable at about 30% each. Reviewing other spirit level apps on the market, it could be determined that campers use those to level their caravan. Other use cases discussed in the project team were:

• taking perfectly horizontal pictures without the need for a graphical overlay that occludes parts of the camera display
• balancing trays while waiting tables
• balancing a barbell during weight-lifting
3.5. Marketing

In order to popularize the app and to inform about the project, a team within the project focused on marketing and the production of content for social media, but also engaged in creating informational graphic and video material, e.g. how-to videos that explain the use of the app or general explanations of sonification. The following channels were addressed:

• Instagram (https://www.instagram.com/tiltification/)
• Facebook (https://www.facebook.com/tiltification/)
• project website (https://tiltification.uni-bremen.de)

While the website gives more basic information on the project, the social media channels were used to inform about news, backgrounds of implementing the app and fun aspects. All in all, the produced content focuses on explaining Tiltification to the user and sonification in general. Furthermore, two surveys were conducted by the marketing team in order to receive important feedback, one, as already mentioned, at the beginning and another one accompanying the beta test phase.

Marketing over social media channels is a paradigm that evolved quite a few years ago. Driven by the simple fact that customers developed a new form of informational emancipation from the usual ways products were advertised, by sharing insights about products themselves, people's trust in advertisements by companies decreased [40, p. 11]. Therefore, using social media became a better way to stay in touch with customers. An advantage for the distribution of information about one's own products is the low cost of participation in social media platforms. Publishing content is free of charge on many platforms, in contrast to advertisements on television and in print media. This is a chance for non-profit projects, like the project presented in this paper, to reach out to many potential users of a product without the need to provide financial resources. The goal of using social media for the Tiltification app is therefore to get the attention of as many potential users as possible. Since the app is free of charge, the probability of getting people interested in trying the app is high. Unfortunately, we are facing the cold start problem [41], i.e., our new user accounts have no existing network and, therefore, a limited audience and only a few dozen downloads. However, the CURAT Sonification Game [16], developed in another master's project, achieved almost 500 downloads after a press release by the university. Therefore, we believe that having an established social media presence will start paying off after we have created some attention with another press release.

4. CONCLUSION AND FUTURE WORK

In this paper we introduced Tiltification, a free, sonification-centered spirit level app. Tiltification provides users with multi-modal information about the tilt angle of the smartphone, which enables them to level furniture, take perfectly horizontal photos etc. We argue that more free and useful sonification apps should be distributed over mainstream channels in order to establish sonification as a part of the user interface besides visualization. We describe how we implemented the app, how we realized and examined accessibility, and how we advertise and distribute our app. Tiltification is released for iOS in the Apple App Store and for Android in the Google Play Store. Apart from that, a free APK is available on the website. The source code will be made publicly available under an open license soon in order to support further development by the community.
Apart from that, the project team is working on tutorials that include use cases and sound examples, so that users understand the app and get used to the sound and its meanings. We did our best to create a useful and appealing app. However, we know that there is room for improvement. To keep it simple, Tiltification is limited to pitch and roll angles, i.e., two degrees of freedom. Acoustically, we could implement the yaw angle by leveraging our three-dimensional sonification [42]. However, this would make the sonification even more complicated to interpret, and additional functions would interfere with our light-weight menu structure. Furthermore, a purely auditory menu could make the app perfectly accessible to blind users, but this concept could make the app less accessible to sighted people. Likewise, a hybrid visual/auditory menu could be confusing.

5. REFERENCES

[1] C. Frauenberger, T. Stockman, and M.-L. Bourguet, "A survey on common practice in designing audio in the user interface," in Proceedings of the 21st British HCI Group Annual Conference on People and Computers: HCI...but Not as We Know It, ser. BCS-HCI '07. Swindon, GBR: BCS Learning & Development Ltd., 2007, pp. 187–194.
[2] T. Hermann, A. Hunt, and J. G. Neuhoff, "Introduction," in The Sonification Handbook, T. Hermann, A. Hunt, and J. G. Neuhoff, Eds. Berlin: COST and Logos, 2011, ch. 1, pp. 1–6. [Online]. Available: http://sonification.de/handbook
[3] T. Ziemer, N. Nuchprayoon, and H. Schultheis, "Psychoacoustic sonification as user interface for human-machine interaction," International Journal of Informatics Society, vol. 11, no. 3, 2020. [Online]. Available: http://www.infsoc.org/journal/vol11/11-3
[4] M. A. Nees, "Auditory graphs are not the "killer app" of sonification, but they work," Ergonomics in Design, vol. 26, no. 4, pp. 25–28, 2018.
[5] J. G. Neuhoff, "Is sonification doomed to fail?" in Proc. 25th International Conference on Auditory Display, Newcastle upon Tyne, 2019, 3 pages. [Online]. Available: http://hdl.handle.net/1853/61531
[6] A. Supper, "Sublime frequencies: The construction of sublime listening experiences in the sonification of scientific data," Social Studies of Science, vol. 44, no. 1, pp. 34–58, 2014.
[7] K. Bijsterveld, Sonic Skills. Listening for Knowledge in Science, Medicine and Engineering. Palgrave Macmillan, 2019.
[8] M. Watson and P. Sanderson, "Sonification supports eyes-free respiratory monitoring and task time-sharing," Human Factors, vol. 46, no. 3, pp. 497–517, 2004.
[9] E. Rutherford and H. Geiger, "An electrical method of counting the number of α particles from radioactive substances," Proceedings of the Royal Society (London). Series A, vol. 81, no. 546, pp. 141–161, 1908.
[10] S. Landry, Y. Sun, D. Slade, and M. Jeon, "Tempo-fit heart rate app: Using heart rate sonification as exercise performance feedback," in Proc. 22nd International Conference on Auditory Display (ICAD 2016), Canberra, July 2016. [Online]. Available: http://hdl.handle.net/1853/56567
[11] D. Avissar, C. Leider, C. Bennett, and R. Gailey, "An audio game app using interactive movement sonification for targeted posture control," in Proc. 19th International Conference on Auditory Display (ICAD 2013), Lodz, July 2013. [Online]. Available: http://hdl.handle.net/1853/51640
[12] T. Hermann, O. Höner, and H. Ritter, AcouMotion – An Interactive Sonification System for Acoustic Motion Control, ser. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer, 2006, vol. 3881, pp. 312–323.
[13] J. Fan and S. Topel, "Sonictaiji: A mobile instrument for taiji performance," in Proc. 20th International Conference on Auditory Display, 2014. [Online]. Available: http://hdl.handle.net/1853/52096
[14] C.-H. Yu, Z. Qin, F. J. Martin-Martinez, and M. J. Buehler, "A self-consistent sonification method to translate amino acid sequences into musical compositions and application in protein design using artificial intelligence," ACS Nano, vol. 13, no. 7, pp. 7471–7482, 2019.
[15] M. D. Temple, "An auditory display tool for DNA sequence analysis," BMC Bioinformatics, vol. 18, paper no. 221, 2017.
[16] T. Ziemer and H. Schultheis, "The CURAT sonification game: Gamification for remote sonification evaluation," in Proc. 26th International Conference on Auditory Display (ICAD 2021), Gainesville, FL, June 2021.
[17] B. K. Davison and B. N. Walker, "Sonification sandbox reconstruction: Software standard for auditory graphs," in Proc. 13th International Conference on Auditory Display, Montreal, June 2007. [Online]. Available: http://hdl.handle.net/1853/50030
[18] T. Ziemer and H. Schultheis, "Psychoacoustic auditory display for navigation: an auditory assistance system for spatial orientation tasks," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 205–218, Sept. 2019.
[19] T. Hillerson, Programming Sound with Pure Data, ser. Pragmatic Programmers. Dallas, TX: The Pragmatic Programmers, LLC, 2014.
[20] P. Brinkmann, P. Kirn, R. Lawler, C. McCormick, M. Roth, and H.-C. Steiner, "Embedding Pure Data with libpd," in Proceedings of the Pure Data Convention, Weimar, 2011.
[21] P. Brinkmann, Making Musical Apps: Real-time Audio Synthesis on Android and iOS, 1st ed. O'Reilly, 2012.
[22] R. Payne, Hello Flutter. Berkeley, CA: Apress, 2019, pp. 3–8. [Online]. Available: https://doi.org/10.1007/978-1-4842-5181-2_1
[23] D. Meiller, Moderne App-Entwicklung mit Dart und Flutter: Eine umfassende Einführung. Berlin, Boston: De Gruyter Oldenbourg, May 2020. [Online]. Available: https://doi.org/10.1515/9783110690651
[24] M. Rodriguez-Sanchez Guerra, "Cross-platform development frameworks for the development of hybrid mobile applications: Implementations and comparative analysis," Oct. 2018. [Online]. Available: http://hdl.handle.net/10498/20951
[25] B. N. Walker and M. A. Nees, "Theory of sonification," in The Sonification Handbook, T. Hermann, A. Hunt, and J. G. Neuhoff, Eds. Berlin: COST and Logos, 2011, ch. 2, pp. 9–39. [Online]. Available: http://sonification.de/handbook/
[26] S. Aytac et al., "Using color blindness simulator during user interface development for accelerator control room applications," in International Conference on Accelerator and Large Experimental Control Systems, 2017, pp. 1958–1968.
[27] S. Deeb, "The molecular basis of variation in human color vision," Clinical Genetics, vol. 67, no. 5, pp. 369–377, 2005.
[28] British Dyslexia Association, "Dyslexia friendly style guide," 2018. [Online]. Available: https://www.bdadyslexia.org.uk/advice/employers/creating-a-dyslexia-friendly-workplace/dyslexia-friendly-style-guide
[29] Web Content Accessibility Guidelines (WCAG) 2.0, W3C Recommendation, Web Content Accessibility Guidelines Working Group, Dec. 2008. [Online]. Available: http://www.w3.org/TR/WCAG20/
[30] A. W. Ellis, Reading, Writing and Dyslexia. Hove: Psychology Press Ltd, 1993.
[31] Y. Navya, S. SriDevi, P. Akhila, J. Amudha, and C. Jyotsna, "Third eye: Assistance for reading disability," in Soft Computing and Signal Processing, V. S. Reddy, V. K. Prasad, J. Wang, and K. T. V. Reddy, Eds. Singapore: Springer Singapore, 2020, pp. 237–248.
[32] J. Tidwell, Designing Interfaces. Sebastopol, CA: O'Reilly, 2006.
[33] W. O. Galitz, The Essential Guide to User Interface Design. An Introduction to GUI Design Principles and Techniques, 3rd ed. Indianapolis, IN: Wiley, 2007.
[34] P. Gui, L. Tang, and S. Mukhopadhyay, "MEMS based IMU for tilting measurement: Comparison of complementary and Kalman filter based data fusion," in 2015 IEEE 10th Conference on Industrial Electronics and Applications (ICIEA), 2015, pp. 2004–2009.
[35] P. Corke, Robotics, Vision and Control, ser. Springer Tracts in Advanced Robotics. Cham: Springer International Publishing, 2017, vol. 118. [Online]. Available: http://link.springer.com/10.1007/978-3-319-54413-7
[36] W. Tao, T. Liu, R. Zheng, and H. Feng, "Gait analysis using wearable sensors," Sensors, vol. 12, no. 2, pp. 2255–2283, Feb. 2012. [Online]. Available: http://www.mdpi.com/1424-8220/12/2/2255
[37] P. Naiya, "Sensors in smartphones to top 10 billion unit shipments in 2020," Jan. 2021. [Online]. Available: https://www.counterpointresearch.com/sensors-smartphones-top-10-billion-unit-shipments-2020/
[38] O. Milinov, "GSMArena," Jan. 2021. [Online]. Available: https://www.gsmarena.com/results.php3?chkGyro=selected
[39] T. Ziemer and H. Schultheis, "A psychoacoustic auditory display for navigation," in Proc. 24th International Conference on Auditory Display (ICAD 2018), Houghton, MI, June 2018. [Online]. Available: http://doi.org/10.21785/icad2018.007
[40] M. H. Ceyp and J.-P. Scupin, Social Media Marketing – ein neues Marketing-Paradigma? Wiesbaden: Gabler, 2011, pp. 9–19. [Online]. Available: https://doi.org/10.1007/978-3-8349-6593-6_1
[41] Z. Ye, D. Zhang, H. Zhang, R. P. Zhang, X. Chen, and Z. Xu, "Cold start on online advertising platforms: Data-driven algorithms and field experiments," SSRN (preprint), Oct. 2020. [Online]. Available: http://dx.doi.org/10.2139/ssrn.3702786
[42] T. Ziemer and H. Schultheis, "Psychoacoustical signal processing for three-dimensional sonification," in Proc. 25th International Conference on Auditory Display (ICAD 2019), Newcastle, June 2019. [Online]. Available: https://smartech.gatech.edu/handle/1853/61499