Week 12: Final Project Progress

This week I focused on getting the mobile app and API to communicate with each other.  To access the input fields from the user, I made unique cells in the collection view for each question.  I then used a singleton class to store the current work session based on the inputs from all of the view controllers.  Once I had this working, I added a POST request to the API that fires when the user taps the submit button at the end of their work session.  I then turned my attention to some of the background variables that I want to save in the API for each work session.  First I added location variables so that, with permission, each work session stores the latitude and longitude of the user's location.  Then I added a sound variable: with permission, the app records the sound level of the location when you start a work session and sends that value to the API. Figure 1 shows the current state of the app.
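The singleton-plus-POST flow above can be sketched roughly as follows. This is a hedged illustration, not the app's actual code: the `WorkSessionStore` name, the field names, and the endpoint are hypothetical stand-ins.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession lives here on Linux
#endif

// Hypothetical singleton that accumulates answers from each questionnaire
// view controller until the user submits the work session.
class WorkSessionStore {
    static let shared = WorkSessionStore()   // single shared instance
    private init() {}                        // prevent other instances

    var answers: [String: String] = [:]      // question id -> answer
    var latitude: Double?
    var longitude: Double?
    var soundLevel: Double?

    // Flatten the session into a JSON-ready dictionary for the POST body.
    func asJSON() -> [String: Any] {
        var body: [String: Any] = ["answers": answers]
        if let lat = latitude, let lon = longitude {
            body["latitude"] = lat
            body["longitude"] = lon
        }
        if let db = soundLevel { body["soundLevel"] = db }
        return body
    }
}

// Hypothetical submit call wired to the submit button; the real route
// and status handling may differ.
func submitSession(to url: URL, completion: @escaping (Bool) -> Void) {
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(
        withJSONObject: WorkSessionStore.shared.asJSON())
    URLSession.shared.dataTask(with: request) { _, response, _ in
        let code = (response as? HTTPURLResponse)?.statusCode ?? 0
        completion((200..<300).contains(code))   // any 2xx counts as success
    }.resume()
}
```

Because every view controller writes into the same shared store, the final POST can assemble the whole session without passing data through segues.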

I also conducted some user testing and got some great feedback on my UI that I would like to address in the coming week (Figure 2).

Figure 2

It was most apparent that there are some issues with the collection view.  Users did not know to scroll to answer the remaining questions, so I added a “swipe to the right to continue” label on the first screen of each questionnaire section.  The scroll feature is also too sensitive: users swipe too aggressively, scroll past some of the questions, and have to scroll backwards.  I am now thinking that a button is a better solution than a scroll gesture. I also got some great feedback about the colors and the design.  Users said that the screens were too similar, so I plan to use color to differentiate between the begin-work-session questionnaire and the end-work-session questionnaire, and to make the colors a bit brighter and more vibrant. 
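One way to sketch the button-driven alternative is to keep the paging state in a small value type and let the view controller feed the result to the collection view. This is only a sketch under assumptions: in the real view controller the returned index would go to `UICollectionView.scrollToItem(at:at:animated:)` with `isScrollEnabled = false`, so users cannot fling past questions.

```swift
import Foundation

// Pure paging logic for a "Next"-button questionnaire: each tap advances
// exactly one page and never runs past the last question.
struct QuestionnairePager {
    let questionCount: Int
    private(set) var current = 0

    // Returns the index the collection view should scroll to.
    mutating func advance() -> Int {
        current = min(current + 1, questionCount - 1)   // clamp at last page
        return current
    }
}
```

Keeping the clamping in one place means the button handler cannot overshoot no matter how quickly the user taps.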

I also added a weather variable to the API that collects the current weather from the Dark Sky API.
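A minimal sketch of that lookup, assuming Dark Sky's forecast endpoint (secret key plus "latitude,longitude" in the path) and a hypothetical choice to store the `summary` string; the fields the app actually saves may differ.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession lives here on Linux
#endif

// Build the Dark Sky forecast URL for a work session's coordinates.
func darkSkyURL(key: String, latitude: Double, longitude: Double) -> URL? {
    return URL(string: "https://api.darksky.net/forecast/\(key)/\(latitude),\(longitude)")
}

// Fetch the forecast and pull out the current-conditions summary.
func fetchWeather(key: String, latitude: Double, longitude: Double,
                  completion: @escaping (String?) -> Void) {
    guard let url = darkSkyURL(key: key, latitude: latitude, longitude: longitude) else {
        completion(nil); return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data,
              let any = try? JSONSerialization.jsonObject(with: data),
              let json = any as? [String: Any],
              let currently = json["currently"] as? [String: Any] else {
            completion(nil); return
        }
        completion(currently["summary"] as? String)   // e.g. "Partly Cloudy"
    }.resume()
}
```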

This coming week I plan to make these small UI adjustments and add some animations to the app to improve the user experience. 

Week 11: Final Project Progress

This week I focused on getting the basic app architecture set up.  I set up all of the view controllers and the main action flow.  Figure 1 shows the resulting app setup.

After that I began to work through the data structure of the app and the problem of storing one work session during one flow through the app.  A link to the current code can be found here.

Week 10: Final Project Progress

This week I focused on getting the API set up for my application and verifying that I could make the proper API requests with Swift.  Using JavaScript, I created an API that uses basic authentication and Mongoose.  I hosted the site on Heroku at this url.  The API documents can be found at this url (however, you will need the proper credentials to access this information).  Once this was set up, I wrote Swift code to test making GET, POST, and DELETE requests.  Figure 1 shows a small app I made to test out these requests with my API.  This program uses the Alamofire and SwiftyJSON CocoaPods. 
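The request pattern can be sketched without the pods using plain `URLSession`, which makes the basic-auth mechanics visible; in the test app Alamofire handles this plumbing. The credentials and routes below are placeholders.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession lives here on Linux
#endif

// Basic authentication header: base64 of "user:password".
func basicAuthHeader(user: String, password: String) -> String {
    let token = Data("\(user):\(password)".utf8).base64EncodedString()
    return "Basic \(token)"
}

// Generic helper covering the GET, POST, and DELETE requests the
// test app exercises against the Heroku-hosted API.
func request(_ method: String, url: URL, user: String, password: String,
             body: [String: Any]? = nil,
             completion: @escaping (Data?) -> Void) {
    var req = URLRequest(url: url)
    req.httpMethod = method                 // "GET", "POST", or "DELETE"
    req.setValue(basicAuthHeader(user: user, password: password),
                 forHTTPHeaderField: "Authorization")
    if let body = body {
        req.setValue("application/json", forHTTPHeaderField: "Content-Type")
        req.httpBody = try? JSONSerialization.data(withJSONObject: body)
    }
    URLSession.shared.dataTask(with: req) { data, _, _ in
        completion(data)
    }.resume()
}
```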

After I verified that the backend part of my application was working, I turned my focus to the app UI.  I began by building the UI/UX design in Adobe XD.  Figure 2 shows the interaction for when the user wants to start a work session.  I plan to continue building out these app flows, focusing on micro-interactions that will make the app smooth and cohesive.  Next, I will begin to build the app in Swift as I simultaneously narrow down the UI/UX design. 

Week 9: Non-Visual Interfaces

After developing my final project idea, I used this week's assignment to brainstorm a concept for the project.  I created an app that detects the sound level in the room around you.  The idea is that you open the app when you start a work session and the app monitors the ambient sound level.  When the sound gets too loud, the app warns you that you might want to consider changing work locations (Figure 1).
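The loudness check can be sketched as below. On device, the reading would come from an `AVAudioRecorder` with `isMeteringEnabled = true`, calling `updateMeters()` on a timer and reading `averagePower(forChannel: 0)`, which reports decibels from roughly -160 (silence) up to 0 (full scale). The threshold here is a guess, not a calibrated value.

```swift
import Foundation

// Hypothetical dB cutoff; a real app would tune this by experiment.
let tooLoudThreshold: Float = -10.0

// Pure decision logic: is the room too loud to work in?
// The same result could also drive a haptic warning on device.
func isTooLoud(averagePowerDB: Float) -> Bool {
    return averagePowerDB > tooLoudThreshold
}
```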

I tried to integrate a haptic feature for when the sound level gets too high, so the phone would vibrate to indicate that the work spot is too loud to work productively.  Unfortunately, I could not get the vibration feature to activate at this stage.  Ultimately, I think this week's assignment was a good experiment for my final project.

Link to code 

To view my final project proposal visit this link.

Weeks 6 & 7: Musical Chairs Game

This week I worked with Ada Jiang to create a peer-to-peer musical chairs game using a multipeer service.  We first developed a storyboard plan for the app (Figure 1) and a data model.

Figure 1

Then we worked together to get the peer-to-peer service working the way we wanted, using two main view controllers.  Eventually we divided up the work: I worked on the introduction view controller, which consisted of the instructions carousel, while Ada worked on the game view controller.  Once we had both controllers working, we worked through small bugs together and finalized the UI to make the app look consistent.  Figure 2 shows the app working on four devices.
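The round logic the game view controller needs each time the music stops can be sketched as follows. This is a simplified stand-in: in the app, the player names would be the `MCPeerID` display names from the Multipeer Connectivity session, and the claim order would come from the order the peers' "sit" messages arrive.

```swift
import Foundation

// Musical chairs elimination: one fewer chair than players, and the
// peer who fails to claim a chair is out.
struct MusicalChairsRound {
    var players: [String]

    var chairCount: Int { return max(players.count - 1, 0) }

    // Peers claim chairs in arrival order; everyone past the last
    // chair is eliminated. Returns the eliminated players.
    mutating func eliminate(claimOrder: [String]) -> [String] {
        let seated = Array(claimOrder.prefix(chairCount))
        let out = players.filter { !seated.contains($0) }
        players = seated
        return out
    }
}
```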

Week 5: Game Controller

This week I experimented with CocoaPods in order to connect my app to a server using WebSockets.  I created a game controller that uses swipe gestures to control your character.  Figure 1 shows the app functionality.

Figure 2 shows the app controlling a character in a game.  The code for this project can be found here.
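The controller's message shape can be sketched as a mapping from recognized swipes to command strings. The command names are placeholders; with a WebSocket pod such as Starscream, each command would be sent with something like `socket.write(string: command)` from the gesture recognizer's handler.

```swift
import Foundation

// Directions a UISwipeGestureRecognizer can report.
enum SwipeDirection { case up, down, left, right }

// Turn a swipe into the short command string the server expects.
func command(for swipe: SwipeDirection) -> String {
    switch swipe {
    case .up:    return "move:up"
    case .down:  return "move:down"
    case .left:  return "move:left"
    case .right: return "move:right"
    }
}
```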

Week 4: Animated Tracking App

This week I updated my waste tracking app and added animations to improve the user experience.  Since my project from last week had some flaws, I decided to start over and create a brand new Xcode project.  As instructed, I refrained from using the storyboard this week and wrote every element (except the view controllers) programmatically.  I found this to be very helpful: last week I think I got lost going back and forth between my code and the storyboard, and creating everything programmatically helped me understand each step of the way.  For the most part, I definitely prefer to work in code. The documents for this project can be found here, and Figure 1 shows the final result. 

Figure 1

Week 3: Tracking App

This week I built an app that could help someone track their kitchen waste.  Each time someone took out the trash, plastic recycling, paper recycling, or compost, they could add that event to the app log.  The app has three main view controllers: a data view, a graph view, and an action view (Figure 1).  The data view uses a table view to list all the data entries with the type of waste that was put out and the time and day.  The graph view shows the same data in a different way: each waste type has a dedicated cell, and its bar grows based on how many total entries were logged for that type.  Finally, the action view is where the user adds a new data point; the user can toggle the waste types on or off depending on what type of data they want to log.

Figure 1

From there I designed a data model that made sense to me (Figure 2).

Figure 2

Once I began to build the app, I had some difficulties using the method laid out for us and my data model so I needed to simplify the model to the one shown in Figure 3.

Figure 3

I realize that this is not the best model to use, but I could not figure out how to send the data from the action view to the table views using the original model.  I also had trouble figuring out how to send data from the action view to both of the table views, so I ended up not fully coding the graph view section.  Figure 4 shows my final result.

Because of the difficulties I had along the way, the app became a bit of a mess.  Now that I understand the process a little more, I think I would be able to use my original data model to create a cleaner app.  The learning curve of understanding how data is transferred from the action view to the table view got in the way of my process a bit and caused me to make changes based on what I understood.  Also, from the way the examples were given to us, it seemed like all of the information in the data model needed to be passed through the action view.  I am wondering how you could provide some information to the data model through the action view and set some through the table views.  Once I better understand how data is passed between views, I think I could recreate this app in a more complete way.
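One pattern that avoids funneling everything through the action view is a single shared log that any view controller can read, plus a notification when it changes. This is a sketch under simplified types, not the assignment's prescribed method: the action view appends entries, the data view reloads its table on the notification, and the graph view recomputes its bars from the per-type counts.

```swift
import Foundation

// Simplified stand-ins for the app's waste categories and log entries.
enum WasteType: String, CaseIterable {
    case trash, plastic, paper, compost
}

struct WasteEvent {
    let type: WasteType
    let date: Date
}

class WasteLog {
    static let shared = WasteLog()
    static let didChange = Notification.Name("WasteLogDidChange")

    private(set) var events: [WasteEvent] = []

    // The action view calls this; observers refresh when the
    // notification fires, so no data needs to ride along a segue.
    func add(_ type: WasteType) {
        events.append(WasteEvent(type: type, date: Date()))
        NotificationCenter.default.post(name: WasteLog.didChange, object: self)
    }

    // Totals per type: exactly what the graph view's bars need.
    func counts() -> [WasteType: Int] {
        var totals: [WasteType: Int] = [:]
        for event in events { totals[event.type, default: 0] += 1 }
        return totals
    }
}
```

Because both table views observe the same store, neither depends on the action view handing them data directly.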

To view the code, visit this link.

Week 2: Unlock App

This week I began to learn more about the Swift language and the general Xcode workflow.  I began by following the Auto Layout lab to get the hang of the constraint features in Xcode (Figure 1). 

Figure 1

Then I developed my own lock screen application based on phone brightness.  The application uses the brightness value on the user's phone to determine whether the phone can be unlocked.  If the brightness is at 100%, the phone can be unlocked; at any lower value, the app will not unlock.  Figure 2 shows an example of someone successfully unlocking the app.

Figure 2

Figure 3 shows an example of someone failing to unlock the screen three times; on the fourth attempt, the app tells them that they failed and need to restart.

Figure 3
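The unlock rule and the failure counting can be sketched as two small pieces of pure logic. On device, the brightness value would come from `UIScreen.main.brightness` (a `CGFloat` from 0.0 to 1.0); everything else below is a simplified stand-in for the view controller's code.

```swift
import Foundation

// Only a 100% bright screen unlocks.
func canUnlock(brightness: Double) -> Bool {
    return brightness >= 1.0
}

// Failure tracking: after the fourth failed attempt the app tells the
// user they failed and need to restart.
struct UnlockAttempts {
    private(set) var failures = 0

    // Returns true when the user has failed enough times to force a restart.
    mutating func recordFailure() -> Bool {
        failures += 1
        return failures >= 4
    }
}
```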

One feature that I struggled with was changing the button label based on the brightness.  When the user changes the brightness in Control Center, I wanted the button label to update dynamically.  Since there is no built-in loop for continuously polling a value in Swift, I struggled to figure out how to make this work. 

Click here for links to the code. 

Update: After office hours I was able to add the dynamic button feature. Figure 4 shows this adjustment.
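One way the dynamic label can work without a polling loop is to observe UIKit's `UIScreen.brightnessDidChangeNotification` and recompute the title whenever the user adjusts brightness in Control Center. The title strings below are stand-ins for the app's real copy, and the observer itself is device-only, so it appears here as a comment.

```swift
import Foundation

// Derive the button title from the current brightness (0.0–1.0).
func buttonTitle(brightness: Double) -> String {
    return brightness >= 1.0 ? "Unlock" : "Too dim to unlock"
}

// In viewDidLoad (device-only, hence commented out here):
// NotificationCenter.default.addObserver(
//     forName: UIScreen.brightnessDidChangeNotification,
//     object: nil, queue: .main) { _ in
//     button.setTitle(buttonTitle(brightness: Double(UIScreen.main.brightness)),
//                     for: .normal)
// }
```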

Figure 4

Week 1: One Button App

This week I completed the Hello Xcode lab and created a one button app.  First, to familiarize myself with Xcode I created a one button app with a single label (Figure 1).

Figure 1: Hello Xcode Lab

I used this app to practice transferring the app onto my mobile device (Figure 2).

Figure 2: Transferring app to mobile device

I then began to build a constellation one-button app.  The idea is for the user to tap a lit-up star on the screen; each tap lights up the next star and connects the current star to the previous ones, eventually showing The Big Dipper.  I began by using Illustrator to build the background images for the app (Figure 3).
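The one-button flow described above can be sketched as a tiny state machine: the stars are an ordered list, one index is "lit" at a time, and each tap advances until the constellation is complete. The seven-star count matches the Big Dipper; the type and names here are illustrative, not the app's actual code.

```swift
import Foundation

struct ConstellationTracer {
    let starCount = 7              // the Big Dipper has seven stars
    private(set) var litIndex = 0  // which star is currently lit

    var isComplete: Bool { return litIndex >= starCount }

    // Called when the user taps the lit star; the view would then draw a
    // line from the previous star and light up the next one.
    mutating func tap() {
        guard !isComplete else { return }   // ignore taps once finished
        litIndex += 1
    }
}
```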

Figure 3: Constellation App background development

I then added these images and some sounds/labels to create the finished product (Figure 4).

Figure 4: Constellation App

I imagine this growing into a larger constellation app where the user points their mobile camera at the sky; the app would find a constellation and guide the user to trace it with their finger. 

The files for this week can be found here.