
Mobile Creative Application Development Assignment 2 Report

Design

When designing my application, I used my original proposal as a rough guide for how to arrange each element. The original idea was to create an application called Growlr, consisting of two oscillators with various effects that the user manipulates to generate sounds. This concept still exists within my application; however, it is now based far less on the idea of a pure synthesis application. Before moving on to production, I consulted some friends and family about the design. Most of them liked the original idea, but when asked, around 70% of them said: "Is there any percussion to go along with it?". From here I decided that the application was going to need some sort of percussion element, and that the best way to implement this on a small screen was to incorporate a sequencer. For the design, I took some inspiration from a piece of software called Drumbit (João Santos, 2017). Below are the original and new ideas for the main interface, followed by the design that I created in response to the user feedback.

As can be seen, the two designs are very different, but some elements remain, such as the play, pause and stop buttons in the top right-hand corner, as well as the synthesis element. As the user feedback suggested that I should keep the synthesiser element, I designed a screen, reached by swiping down, that houses the synthesiser controls (below). From here the user can choose different wave shapes for the oscillators, as well as change the semitone and octave, as mentioned in the original proposal. The amplitude of each oscillator can also be controlled, to allow greater control over the sound being produced. Below is the design for the octave and semitone pages/overlays.

Once I had designed the oscillator page view, I went on to design the other pages/overlays. In my original proposal, I was against the idea of implementing a MIDI keyboard because of the limited space on mobile displays. However, after consulting with friends and family about the design, it was apparent that I was going to need a way of letting the user play along to the percussion beat created on the previous page. To achieve this, I designed a MIDI keyboard that can be accessed by pressing the keyboard button at the top of the page. Below is how the design of the keyboard would roughly look.

I took the same approach for the other sections on the top navigation bar. Below are the filter page/overlay and the delay page/overlay.

And finally, the distortion page/overlay.

Once I had a rough idea of the design of the application, I had to research which algorithms I would need for the application to function correctly. To do this I split the application into manageable tasks, as shown below.

The Sequencer
- How to play samples at the correct time
- How to play a sample if a certain box/step is clicked
- How to change the colour of a box/step when it is clicked
- How to adjust the tempo
- How to adjust the swing
- How to connect the sequencer output to the speakers
- How to associate these parameters with controls

The Synthesiser
- How to create oscillators
- How to change attributes of the oscillators
- How to route the sound of the oscillators
- How to connect the oscillators to the keyboard
- How to associate these parameters with controls

The Keyboard
- How to create the buttons of the keyboard
- How to associate the buttons on the keyboard with MIDI note numbers
- How to connect the keyboard to the oscillators

The Effects
- How to create a delay
- How to create distortion
- How to create a filter
- How to route the audio through the effects
- How to associate these parameters with controls
- How to build a logarithmic range slider

To create the application, I decided to use HTML, CSS and JavaScript. In my original proposal, I stated that I would be using Swift; however, for reasons of accessibility to the technology, I decided that web technologies would be better suited, so I was able to tailor my research in the above areas towards the web. In my research, I found that I would be able to make use of the Web Audio API (Mozilla, 2017), which is built into modern browsers. This would allow me to create the oscillators, as well as the effects, with relative ease. I also discovered a library called MCAD (Dr J Ferraris, 2017), which lets users create pattern schedulers and which I believed would allow me to create the sequencer with relative ease.

With a small amount of research into the field completed, I decided that the best way to grasp an understanding of the technology was to start creating the application.

Implementation

When creating the GUI I used four main elements within the HTML: <button>, <table>, <div> and <input type="range">. First I had to create a div to be used as the surface of the entire application. Inside the surface div, I needed a container div that fills out to all four corners of the surface, which would then allow me to place other divs within it. Here is the code needed to achieve this. Within the container div I created two other divs, called top and bottom. I then split the top div into further sections, which house the controls at the top of the sequencer design; below is a screenshot of the code needed to achieve this. As can be seen above, I filled these divs with HTML elements that allow the user to interact with the sequencer, such as the play, pause and stop buttons, as well as some range controls/sliders.
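In outline, the markup described above looks something like this (a sketch only; the class and id names are illustrative rather than my exact ones):

```html
<div class="surface">
  <div class="container">
    <div class="top">
      <button id="play">Play</button>
      <button id="pause">Pause</button>
      <button id="stop">Stop</button>
      <input type="range" id="tempo" min="60" max="180">
    </div>
    <div class="bottom">
      <!-- sequencer table goes here -->
    </div>
  </div>
</div>
```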

Once I had created the top part of the sequencer, I moved down to the bottom div, created a div to house the sequencer, and filled it with a table element. In this case the parent div is called sequencer and the table is housed within it. Each td element represents one cell within the table. Each td element also has a data-step attribute with a number attached to it, which is used later on in the JavaScript to identify each square within the table.
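A rough sketch of the table markup (attribute and id names illustrative; the real table has one row per channel and one cell per step):

```html
<div id="sequencer">
  <table>
    <tr>
      <td data-step="0"></td>
      <td data-step="1"></td>
      <td data-step="2"></td>
      <td data-step="3"></td>
      <!-- ...one td per step... -->
    </tr>
    <!-- ...one tr per channel... -->
  </table>
</div>
```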

Within the bottom div there are two more divs, which house the range sliders for the volume, as well as labels for each channel of the sequencer, as can be seen below.

This completed the first page of the application. I then went on to create the second page, which houses the synthesiser. To do this I created another div that inherits from the same class as the first surface; this creates a second page underneath the first, which can be scrolled down to. As the dimensions of the surface adhere to a 16:9 aspect ratio, each surface fills the entire display of the device. Within the second surface is another container, which houses all of the elements of the synthesiser, including buttons, images and ranges/sliders, using the same technique as above.

However, on this page I also created divs called overlays. The purpose of an overlay is to act like a page within the parent page, meaning that I could show and hide elements of the synthesiser without them needing to be constantly on show. To do this I created an overlay div and filled it with HTML elements such as ranges/sliders, buttons and labels, as can be seen below. For the overlays to operate properly, they needed to be positioned at the end of the HTML document; otherwise elements of the surface would show underneath the overlay, which is not the intended outcome. As can be seen, each element within my HTML code has its own div, usually named xxxarea. This could be seen as inefficient; however, I found it was the best method for keeping the display completely responsive. Using elements such as HTML lists would have been easier and less time-consuming, but I found that the text would shift out of position when scaling to a larger or smaller size, which caused an issue on smaller-screen devices.

Once I had all of the HTML laid out in the correct locations, I used CSS to style the page. First, I needed to define the area of the surface. As mentioned above, I needed to use the aspect ratio for it to scale correctly on mobile devices. To do this, within the CSS, I set the width of the surface to 100% and then used the padding-bottom property with a value of 56.25%. This allows the surface to be scaled in accordance with the 16:9 ratio: to get 56.25%, divide the height (1080) by the width (1920) and multiply by 100. I then needed to fill the surface with the container. This was done by giving the container a position of absolute and defining the left, right, bottom and top as 0, which pushes the container out to fill the entire area of the surface.
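In outline, the surface and container CSS looks like this (a sketch; class names illustrative):

```css
/* 16:9 surface via percentage padding: 1080 / 1920 * 100 = 56.25% */
.surface {
  position: relative;
  width: 100%;
  padding-bottom: 56.25%;
}

/* container pushed out to all four corners of the surface */
.container {
  position: absolute;
  top: 0;
  right: 0;
  bottom: 0;
  left: 0;
}
```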

As I had created an area div for each element in the HTML so that it would scale correctly, I gave each element an id, which allowed me to take the same approach as above: instead of filling the div with another div, it is filled with the HTML element itself (button, range/slider or label). For this to work, the parent element needs a position of relative. The element can then be given a position of absolute, with top, bottom, left and right all set to 0, as can be seen below. I took this approach for many of the elements within my application, as can be seen in my CSS code. I also experimented a little with Flexbox (W3Schools, 2017) in some areas of the CSS, which allowed me to position items centrally within a div without needing to define the left, right, top and bottom: the parent is set to display: flex, which allows the child to have a margin of auto.
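The two positioning techniques can be sketched like this (.xxxarea stands in for the real area ids):

```css
/* absolute-fill approach: the area div is relative, its element fills it */
.xxxarea {
  position: relative;
}
.xxxarea > button {
  position: absolute;
  top: 0;
  right: 0;
  bottom: 0;
  left: 0;
}

/* Flexbox alternative: centre a child without defining any offsets */
.centered {
  display: flex;
}
.centered > * {
  margin: auto;
}
```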

Once I had created and styled my design/layout, I needed to add some functionality to the application. To do this I used JavaScript along with jQuery. For the HTML to recognise the JavaScript, I needed to import it at the bottom of the HTML document, along with the MCAD and jQuery libraries, as can be seen below. As mentioned above, I split the algorithm section of the application into manageable parts. I started with the sequencer, as this seemed the most time-consuming task. I first needed to create an audio context and a scheduler. To gain an insight into how to build the sequencer, I looked at some sample code from John Ferraris (2017). The sample code included a sequencer that loads samples from an online source and stores them ready for playback; the samples are then loaded into a buffer and played when the step is on. However, it catered for only one track, and I needed six separate channels. I therefore decided to dissect the code and implement changes step by step until I had a six-channel version of the original. The code starts with an array that stores whether or not a certain box within the sequencer table has been clicked.
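As a sketch of that idea (the array shape here is illustrative; the real code drives it from the MCAD scheduler), the pattern store looks something like this:

```javascript
// 6 channels x 16 steps; 1 means the cell is on, 0 means off.
const CHANNELS = 6;
const STEPS = 16;
const pattern = Array.from({ length: CHANNELS }, () => new Array(STEPS).fill(0));

// Returns true when a given channel should sound on a given step.
function stepIsOn(channel, step) {
  return pattern[channel][step] === 1;
}
```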

I then needed an if statement that checks whether the pattern within the array is on and, if so, loads the sample into the buffer for playback when the play head reaches its position. In the example below, the audio source is connected to a gain node for its channel, and that gain node is connected to a second gain node used for the master volume. The master volume is then connected to the audio context destination, which is the sound card of the device. Next, I created the function that is activated when a cell within the table is clicked; the code below shows the click handler for the table cells. First, variables are created for each channel. Then, a few more variables are created to get the data-step value, as mentioned above, from the HTML. Finally, the two variables are used to toggle the sample on and off within the array defined at the start.
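A standalone sketch of that click handler (the real code reads the values with jQuery; attribute and class names here are illustrative):

```javascript
// Toggles a sequencer cell: flips the pattern entry for that channel/step
// and toggles an 'active' class so the box changes colour when clicked.
function handleCellClick(cell, pattern) {
  const channel = parseInt(cell.dataset.channel, 10);
  const step = parseInt(cell.dataset.step, 10);
  pattern[channel][step] = pattern[channel][step] ? 0 : 1;
  cell.classList.toggle('active');
  return pattern[channel][step];
}
```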

Next, I created the synthesiser. To do this I followed a guide from Mozilla (2017), which explains how to create an oscillator and the attributes that can be assigned to it. Below is the code needed to create an oscillator with a sine wave. From here the oscillator could be connected to the audio context destination, which would send it to the output. However, as I wanted the oscillator to be controlled by a keyboard, I needed to route the audio elsewhere.
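In outline (with the AudioContext passed in as a parameter so the routing stays flexible), creating the sine oscillator looks like:

```javascript
// Creates a running sine-wave oscillator at the given frequency.
// ctx is a Web Audio AudioContext created elsewhere.
function createSineOscillator(ctx, frequency) {
  const osc = ctx.createOscillator();
  osc.type = 'sine';
  osc.frequency.value = frequency;
  osc.start(); // started once; note on/off is handled later by a gain node
  return osc;
}
```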

When an oscillator is created, it can only be started once. This means that, to create the sense of a note on and note off, a gain node is needed. Using a gain node, an event handler can be attached to a button that brings the gain node up to full volume when the button is pressed and back to mute when the button is released. Below is a screenshot of the code required to do so. The code also takes the value associated with the HTML buttons, called data-midi, which holds the MIDI value of that particular key. The data-midi attribute can then be manipulated by the range sliders in the octave and semitone sections of the synthesiser, which in turn changes the MIDI value entering the mcad.midinotetohz section of the code. The resulting value is sent to the oscillator, which plays the note associated with that key.
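The gate can be sketched as follows. The midiNoteToHz helper below is the standard equal-temperament formula, standing in for the MCAD helper that the real code uses:

```javascript
// Standard MIDI-to-frequency conversion: A4 (MIDI note 69) = 440 Hz.
function midiNoteToHz(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Key pressed: retune the always-running oscillator, then open the gate.
function noteOn(osc, gainNode, midiNote) {
  osc.frequency.value = midiNoteToHz(midiNote);
  gainNode.gain.value = 1; // full volume
}

// Key released: close the gate; the oscillator itself keeps running.
function noteOff(gainNode) {
  gainNode.gain.value = 0; // mute
}
```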

I then needed to create a keyboard using buttons, to which I could assign different MIDI numbers. Below is a screenshot of the buttons created in the HTML document and their associated MIDI numbers.
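A sketch of the key markup (values shown for one stretch of keys; the real keyboard covers more, and the class name is illustrative):

```html
<button class="key" data-midi="60">C</button>
<button class="key" data-midi="61">C#</button>
<button class="key" data-midi="62">D</button>
<button class="key" data-midi="63">D#</button>
<button class="key" data-midi="64">E</button>
<!-- ...and so on up the keyboard... -->
```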

Once I had the synthesiser playing when different keys were pressed, I moved on to the effects section. In Web Audio, effects are easy to implement, and only a small amount of audio routing is needed for them to function correctly. For the filter, it was as simple as creating a biquad filter node and assigning certain attributes to it. I decided to create controls to let users manipulate different aspects of the filter; below is the implementation of the biquad filter node. I then assigned the attributes of the filter, such as the filter frequency and the filter type, to buttons and a range/slider. When creating the range/slider I decided to use a logarithmic scale, to allow for better control over the filter. This can be seen in the screenshot below, which applies a logarithmic scale to the range/slider between 200 and 20000.
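The logarithmic mapping can be sketched as a small function: a slider position between 0 and 1 is mapped onto 200–20000 Hz so that equal slider movements give equal octave steps. The filter setup around it is shown with the AudioContext passed in (a sketch; the default type and cutoff here are illustrative, since the buttons switch the type in the real code):

```javascript
// Maps a linear slider position (0..1) onto a logarithmic frequency range.
function sliderToFrequency(position, min = 200, max = 20000) {
  return min * Math.pow(max / min, position);
}

// Creates a biquad filter with a mid-slider default cutoff.
function createFilter(ctx) {
  const filter = ctx.createBiquadFilter();
  filter.type = 'lowpass'; // switched by the filter-type buttons
  filter.frequency.value = sliderToFrequency(0.5); // 2000 Hz
  return filter;
}
```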

Implementing the delay was similar to the filter, in that the delay effect can be accessed in Web Audio by simply creating a delay node. When creating the delay, I also needed to create a bypass gain node and a feedback gain node, to allow the delay to repeat the sound at a later point in time. From here it was a matter of routing the audio between the delay node, feedback gain node and bypass gain node, before sending it to the audio context destination. I took the same approach as with the filter, assigning the attributes of the delay to ranges/sliders so the user can control them.
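The routing can be sketched like this (node names and parameter values are illustrative; in the real code the sliders drive them). The source is connected to both the delay node (wet) and the bypass gain node (dry), and both feed the audio context destination:

```javascript
// Builds the delay path: `feedback` loops the delayed signal back on
// itself, producing the repeats; `bypass` carries the dry signal.
function createDelayEffect(ctx, delayTime, feedbackAmount) {
  const delay = ctx.createDelay();
  const feedback = ctx.createGain();
  const bypass = ctx.createGain();
  delay.delayTime.value = delayTime;
  feedback.gain.value = feedbackAmount; // kept below 1 so repeats die away
  delay.connect(feedback);
  feedback.connect(delay); // the feedback loop creates the repeats
  return { delay, feedback, bypass };
}
```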

Finally, I created the distortion module, again taking the same approach. However, for the distortion an algorithm was needed to create the shaping curve that produces the distortion. The algorithm came from Mozilla (2017); it creates a wave shape and stores it within an array, and the amount of distortion applied to the sound depends on the number fed into the algorithm. I have therefore given the user several options that change the amount of distortion applied to the output, by changing the number entering the algorithm at the press of a button.
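The curve generation follows the Mozilla (2017) WaveShaperNode example; amount is the number the buttons feed in:

```javascript
// Builds the shaping curve for a WaveShaperNode; larger `amount` values
// bend the curve harder and so distort the signal more.
function makeDistortionCurve(amount, samples = 44100) {
  const curve = new Float32Array(samples);
  const deg = Math.PI / 180;
  for (let i = 0; i < samples; i++) {
    const x = (i * 2) / samples - 1; // spread the samples across -1..1
    curve[i] = ((3 + amount) * x * 20 * deg) / (Math.PI + amount * Math.abs(x));
  }
  return curve;
}
```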

User Testing

When it came to testing my application, I was lucky enough to have an Android device to deploy to, and I was also able to ask a few friends to test on their Android devices. None of the people I deployed my application to had any experience in music technology. I asked them to use the application and tell me how easy they thought it was to use. Most replied that it was simple and clear to understand and use. However, some of them explained that they did not understand the synthesis part of the application, but liked the drum kit. As none of them were from a musical background, I was happy for them not to completely understand how this particular element worked. As the application has a single view and uses overlays instead of directing the user to separate pages, the response is very quick. Because the application is technically only one page, all of the available options are visible to the user from the start: there are no hidden items or hard-to-reach controls. I have also made the controls large and spacious, so the user can adjust parameters easily even on a smaller screen; this was also something mentioned by the users testing the application.

One of the biggest changes to the application was the removal of the synthesiser channel on the sequencer. The original idea, as in my original proposal, was to create the growl in the synthesiser and then use the sequencer to play the growl back. However, the user feedback suggested that the synthesiser section of the sequencer was not being used, due to the lack of control, so I decided to remove it, as can be seen in the final product. When I deployed the application to my friends' Android devices for the first time, I was told that the sequencing section was running very slowly and the beats were out of time. At first I thought the issue might be that the phones did not have enough processing power. However, after closer analysis, I found that the reason was the scheduler not being able to look ahead quickly enough. To solve this, a small piece of code needed to be added to the scheduler instance, to give the scheduler a longer time to look ahead; below is a screenshot of the code that needed to be added.
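I cannot reproduce the exact MCAD property here, but the idea behind the fix is the standard look-ahead pattern: rather than scheduling each step at the moment it is due, the scheduler wakes slightly early and books every step falling inside the next look-ahead window against the audio clock. A sketch of the window calculation:

```javascript
// Returns the start times of every step inside the look-ahead window,
// so each can be booked ahead of time with source.start(time).
function stepsToSchedule(currentTime, nextStepTime, stepLength, lookahead) {
  const due = [];
  let t = nextStepTime;
  while (t < currentTime + lookahead) {
    due.push(t);
    t += stepLength;
  }
  return due;
}
```

A longer lookahead gives the scheduler more slack, which is why extending it fixed the out-of-time beats on slower phones.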

During the user testing, I was also made aware of an issue that prevented users from playing the MIDI keyboard. Every now and again the keyboard would play a note, but it would cut out shortly after. At first I thought this might be an issue with the scheduler or the performance specs of the devices; however, the issue appeared across all devices, and in the desktop simulation tests I could not replicate it, as it only happened on the devices themselves. After adding a breakpoint in the deployed code, it became clear that the JavaScript responsible for clicks was not having the same effect on the device as in the browser: the mouse-down event handler could not process the tap-and-hold gesture on the MIDI keyboard on a mobile device, so the code responsible for sounding the keys was not being triggered. To remedy this bug, I used a separate gesture recogniser, developed especially for mobile devices, which recognises the tap and hold and allows the notes to play back correctly. I have also kept the mouse-down and mouse-up events, so the application can be used in a desktop browser as well as on mobile devices.
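The fix can be sketched as a helper that binds both sets of events to a key (the real code also goes through the gesture recogniser library; the event names here are the standard DOM ones):

```javascript
// Binds touch events (mobile) and mouse events (desktop) to one key so
// the same note-on/note-off handlers fire on both kinds of device.
function bindKey(keyElement, noteOn, noteOff) {
  keyElement.addEventListener('touchstart', (e) => {
    e.preventDefault(); // stop the synthetic mouse events firing as well
    noteOn();
  });
  keyElement.addEventListener('touchend', () => noteOff());
  keyElement.addEventListener('mousedown', () => noteOn());
  keyElement.addEventListener('mouseup', () => noteOff());
}
```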

Analysis

Overall I believe that the final application was a success. It has many strengths, such as the simple user interface, as well as being an all-in-one instrument. The ability to play keyboard notes alongside the sequencer percussion is, I believe, the biggest strength differentiating the application from the competition, which usually focuses on only one aspect, such as sequencing or synthesis. Although the application has many strengths, it also has a few weaknesses. On the MIDI keyboard, the user can only play one note at a time, as the synthesiser is monophonic; implementing a polyphonic synthesiser could be something to look at in a future update. The MIDI keyboard also has a very slight delay when pressing the keys, due to the way the bound touchstart method works. There is a way around this delay with an external library called Hammer Time; however, this was something I was not able to implement successfully, and it could also be considered in a future update. Compared to the original proposal, the application has changed and expanded considerably; however, I believe the overall product is much more user friendly and far more feature rich in comparison. In terms of the MoSCoW analysis, the application fulfils all of the must-have items and addresses a few should-have items. For future improvement, I would like to address the issues mentioned above in relation to the keyboard. I would also like to address some of the could-have items in the MoSCoW analysis, such as the ability to upload creations and have them ranked on a social platform; however, instead of a growl, the uploaded creation would now be an eight-bar loop of percussion and synthesis.

References

João Santos, 2017. Drumbit. Available from: https://www.pluraldev.com/drumbit/ [Accessed 13 December 2017].

Mozilla, 2017. Web Audio API. Available from: https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API [Accessed 13 December 2017].

Dr J Ferraris, 2017. MCAD.library. Available from: http://drjferraris.github.io/mcad.library/ [Accessed 13 December 2017].

W3Schools, 2017. CSS3 Flexible Box. Available from: https://www.w3schools.com/css/css3_flexbox.asp [Accessed 15 December 2017].

John Ferraris, 2017. Multiple Tracks - Plunkr. Available from: https://embed.plnkr.co/yfon8ezmavgdpn4upfhk/ [Accessed 15 December 2017].

Mozilla, 2017. OscillatorNode. Available from: https://developer.mozilla.org/en-US/docs/Web/API/OscillatorNode [Accessed 15 December 2017].

Mozilla, 2017. WaveShaperNode. Available from: https://developer.mozilla.org/en-US/docs/Web/API/WaveShaperNode [Accessed 15 December 2017].