Week 5 – Introduction to a variety of inputs (XR & different haptics)

Civilisations AR (Android, iOS)
  1. Is it possible to use the app in the living room, or will they need an open space for that? How is that communicated to the users?
    • It can be used in a room or living room, even while lying in bed. On-screen text gives instructions to guide the user, and a line indicates where to look for a certain area.
  2. Hardware capabilities
    • Camera; only available on Apple’s iPhone 6S and above
  3. Sometimes users can get too immersed in an AR experience, so they ignore physical objects around them. As a result, they can bump into objects or people. To prevent such behavior, is there a reminder in the app about that?
    • Nope
  4. Do they guide users to move backward or forward?
    • No, though there are guidelines for placing the artefacts on a flat surface, and a line guides the user to look towards where the digital artefact is displayed.
  5. To prevent fatigue, sessions should be short. Does this app add periods of downtime to help users relax?
    • Nope
  6. How do they gather data from the user? Do they have a sensor or a camera?
    • Using a camera to check the surface and place the artefact.
  7. Is the UI cluttered?
    • It’s pretty clear and tells you how to use the app straight away.
  8. Does it include sound?
    • Yes, when the user taps an artefact, there is a sound button that reads out some text.
  9. How realistic do the objects look?
    • The virtualisation of the artefacts is fairly realistic.
  10. Do they use familiar patterns? For example, swipe and tap?
    • Yes. Interactive pages use tap and swipe.
  11. How do the users learn to interact with the screen? Is it guided?
    • Yes, it has clear guidelines to show the user how to interact with the app.
IKEA Place (iOS, Android)
  1. Is it possible to use the app in the living room, or will they need an open space for that? How is that communicated to the users?
    • It can be used in a room or living room via the ‘View in Room’ button.
  2. Hardware capabilities
    • Camera only
  3. Sometimes users can get too immersed in an AR experience, so they ignore physical objects around them. As a result, they can bump into objects or people. To prevent such behavior, is there a reminder in the app about that?
    • Nope
  4. Do they guide users to move backward or forward?
    • No, it doesn’t show any such guidance.
  5. To prevent fatigue, sessions should be short. Does this app add periods of downtime to help users relax?
    • Nope
  6. How do they gather data from the user? Do they have a sensor or a camera?
    • Using the camera
  7. Is the UI cluttered?
    • It’s clear and tells you to use the AR view, but it is only available for some items.
  8. Does it include sound?
    • Nope
  9. How realistic do the objects look?
    • The virtualisation of the product is fairly realistic.
  10. Do they use familiar patterns? For example, swipe and tap?
    • Yes. The AR pages use tap but not swipe.
  11. How do the users learn to interact with the screen? Is it guided?
    • Yes, it has clear guidelines to show the user how to open the AR view, but there is no guidance on the AR page itself.
JigSpace (iOS, Android)
  1. Is it possible to use the app in the living room, or will they need an open space for that? How is that communicated to the users?
    • It can be used in a room or living room, even while lying in bed, with a play/view button communicating that to the user.
  2. Hardware capabilities
    • Camera, sound, and some production options
  3. Sometimes users can get too immersed in an AR experience, so they ignore physical objects around them. As a result, they can bump into objects or people. To prevent such behavior, is there a reminder in the app about that?
    • Nope
  4. Do they guide users to move backward or forward?
    • No, it doesn’t show any guidance for this.
  5. To prevent fatigue, sessions should be short. Does this app add periods of downtime to help users relax?
    • Nope
  6. How do they gather data from the user? Do they have a sensor or a camera?
    • It uses the camera, along with some options and an add button for creating your own 3D product.
  7. Is the UI cluttered?
    • It’s clear and tells you how to use each function button.
  8. Does it include sound?
    • Yes, but the sound elements need to be supplied or played by the user.
  9. How realistic do the objects look?
    • The virtualisation of the product is fairly realistic.
  10. Do they use familiar patterns? For example, swipe and tap?
    • Yes. The AR pages use tap but not swipe.
  11. How do the users learn to interact with the screen? Is it guided?
    • Yes, it’s very clear, and there are instructions and videos to guide the user when they first start using the software.

Week 3 – The user-centred approach and prototyping

Weekly Task: You have recently been hired as a user experience designer for a new product: The product is planned to launch with augmented reality (AR) feature for indoor “wayfinding” inside large buildings and on company or educational campuses. The feature is meant to augment or replace traditional internal signage and assist people in finding their way inside buildings they are unfamiliar with.

Main interface:

  1. This is the interface I’ve set up. The main interface has a registration feature, allowing users to create private accounts or sign in.

Home Page:

  1. On this Home page, there is an AR navigation feature. Users can choose between buildings or campuses, and I also want to add an icon for hospitals, so the system can accurately determine the location and where the user wants to go.
  2. Users can input the desired address either by voice or by typing. The address input has a location feature that can determine the current location based on GPS.
  3. The central part displays a 3D interior view of the building to inform users of their current location and where they want to go, specifying the floor and location.
  4. At the bottom, there’s an AR feature where users can activate their cameras. Based on the terrain, a 3D character provides voice or text directions.
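
The route-finding behind the 3D interior view and AR directions is, at heart, a shortest-path search over a graph of the building's rooms and corridors. As a minimal sketch (the `find_route` helper and the toy floor plan below are illustrative assumptions, not the product's actual data model):

```python
from collections import deque

def find_route(graph, start, goal):
    """Breadth-first search over a building graph: rooms are nodes,
    walkable connections (corridors, doors, stairs) are edges.
    Returns the shortest list of rooms from start to goal, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        room = path[-1]
        if room == goal:
            return path
        for neighbour in graph.get(room, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

# A toy floor plan: Entrance -> Lobby -> (Cafe, Stairs), Stairs -> Room 201
floor_plan = {
    "Entrance": ["Lobby"],
    "Lobby": ["Entrance", "Cafe", "Stairs"],
    "Stairs": ["Lobby", "Room 201"],
    "Cafe": ["Lobby"],
    "Room 201": ["Stairs"],
}
print(find_route(floor_plan, "Entrance", "Room 201"))
# → ['Entrance', 'Lobby', 'Stairs', 'Room 201']
```

In a real deployment, GPS (or indoor positioning) would supply the start node, and the resulting path would drive the 3D character's turn-by-turn voice or text directions.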

Recent Page:

  1. On this ‘Recent’ page, the main purpose is to provide users with a history of their searched locations. This allows users to reuse these records the next time they use the system, eliminating the need to re-enter the address. Additionally, based on the user’s current location, suggestions are provided to help users quickly find their desired destination.
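
The ‘Recent’ behaviour described above (most recent first, no duplicate entries, no re-typing) can be sketched with a simple list helper; `add_recent` and its `limit` parameter are hypothetical names for illustration, not the app's real code:

```python
def add_recent(history, destination, limit=10):
    """Record a searched destination, most recent first, without
    duplicates, keeping at most `limit` entries."""
    if destination in history:
        history.remove(destination)   # re-searching moves it to the top
    history.insert(0, destination)
    del history[limit:]               # cap the history length
    return history

recent = []
for dest in ["Library", "Room 201", "Cafe", "Room 201"]:
    add_recent(recent, dest)
print(recent)  # → ['Room 201', 'Cafe', 'Library']
```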

User Page:

The ‘User’ page mainly aims to allow users to customise their own account. Its features include:

  1. Setting a profile picture,
  2. Changing the password,
  3. Selecting navigation voices, such as a female or male voice, a cartoon character, a favourite star’s voice, etc.
  4. Users can also choose a character from their favourite celebrities or teams.
  5. There’s a ‘define community’ section where users can support certain groups like the LGBTQ community, feminist community, environmental protection community, etc. This is designed so users can add related elements when customising their characters.
  6. Styles can be chosen from rock, student, cartoon, or professional styles.
  7. This also includes religious elements for users who wish to add related elements.
  8. Additionally, the character’s appearance and clothing can all be customised.

Setting:

In the ‘Setting’ section, users can configure some fixed options such as:

  1. Country,
  2. Language,
  3. Interface colour.
  4. There’s also a feature to set emergency contact details, linked to the system’s location features.
  5. A section for frequently asked questions and a smart customer service system.
  6. Users can also upload feedback, file complaints about the software, and even delete their accounts.
  7. All these features are provided to give users freedom when using this software.
  8. Additionally, there’s an option to send emergency messages to notify specific contacts of one’s exact location. This feature is designed for assistance in cases of fires or other emergencies.

https://app.mockplus.com/app/aNq4Kjm0yp/comment/72zXvGtav_/PD3hW2IC95

Week 2 – Cognition, Memory and Personas

Weekly Task:

Q: Can you give examples of external representations at the interface that reduce memory load and facilitate computational offloading?

A: For historical or event-based data in a museum, an interactive timeline can visually represent the sequence and duration of historical events, eliminating the need for users to remember them.

Q: Do you think that new technologies such as Augmented Reality and speech recognition systems have reduced people’s ability to remember? If yes, in which ways?

A: I think these technologies can reduce people’s ability to remember. For example, using AR apps for navigation may reduce our need to remember routes or landmarks. Similarly, asking Siri on an iPhone to call someone might reduce the necessity of remembering phone numbers.

Q: Can you give us an example of an everyday activity (that you do, so you are the target user group), and it can be done using a different interface?

A: A smart mirror with voice recognition: while showering, I can use voice commands such as “Play my morning playlist” or “Play relaxing music” to start the music.
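
The interaction above amounts to mapping a recognised phrase to an action. A minimal sketch of such a dispatcher (the phrases, responses, and `handle_command` function are assumptions for illustration, not a real smart-mirror API):

```python
def handle_command(command):
    """Map a recognised voice phrase to a mirror action.
    Matching is case-insensitive; unknown phrases get a fallback."""
    actions = {
        "play my morning playlist": "Starting your morning playlist",
        "play relaxing music": "Starting relaxing music",
    }
    return actions.get(command.strip().lower(),
                       "Sorry, I didn't catch that")

print(handle_command("Play my morning playlist"))
# → Starting your morning playlist
```

A real system would sit behind a speech-recognition layer and trigger device actions rather than return strings, but the command-to-action lookup is the core idea.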

Week 1 – Scope of HCI and brief history

Weekly Task:

I have chosen ‘Interactive exhibitions in museums’ as my topic. Through my research, I discovered that HCI in museums or galleries has gained significant traction in recent years (Mery Keitel, 2012), redefining the visitor experience. These exhibitions are not just limited to touch screens or audio guides. Instead, they encompass a broad spectrum of technologies, ranging from augmented reality (AR) to virtual reality (VR) and from gesture-based controls to responsive environments. By using those technologies, museums can offer visitors more interactive, educational, and immersive experiences (Morse et al., 2023).

In my findings, the Cleveland Museum of Art has departed from the traditional museum mode of directly viewing art pieces and has instead adopted an interactive exhibition approach. The museum has installed the largest multi-touch screen in the United States, presenting 4,100 significant art pieces from its collection on a 12.2-metre-long electronic screen, much like postcards. Detailed information about a piece is displayed when you touch the screen to select an artwork; simultaneously, similar pieces automatically appear around it. The images on the screen can be freely zoomed in and out, allowing viewers to see the minute details of the artwork. Through interactions between viewers and the collection, as well as interactions among viewers, the museum emphasises interactivity as a key concept in modern museology. By integrating the latest technology, it promotes effective dissemination and promotion of the museum’s exhibition and educational functions.

https://www.smithsonianmag.com/innovation/cleveland-museum-art-wants-you-to-play-with-its-art-180968007/

References:

Mery Keitel, A.S., 2012. Human computer interaction in museums as public spaces: A research of the impact of interactive technologies on visitors’ experience (Doctoral dissertation).

Morse, C., Niess, J., Bongard-Blanchy, K., Rivas, S., Lallemand, C. and Koenig, V., 2023. Impressions that last: representing the meaningful museum experience. Behaviour & Information Technology, 42(8), pp.1127–1154.